
Neural Information Processing Systems

We thank all the reviewers for excellent questions and many relevant remarks. Thank you for this remark. One of the reasons for this is that our method produces interpretations directly in terms of the input features. Thank you for pointing this out; we agree that "faithful" is not the best term. This is not the case for local models such as LIME.
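The contrast drawn here can be made concrete with a minimal sketch of a LIME-style local surrogate. Everything below (the `local_surrogate` helper, the toy black-box `f`, the point `x0`) is our own illustration, not the authors' method: a local surrogate fits a weighted linear model to perturbed samples around one input, whereas the method in the rebuttal reports interpretations directly on the input features.

```python
import numpy as np

def local_surrogate(f, x0, n_samples=500, scale=0.1, seed=0):
    """Fit a local linear surrogate to a black-box f around x0 (LIME-style).

    Perturb x0 with Gaussian noise, query f, and solve a proximity-weighted
    least-squares problem; the coefficients approximate f's local behaviour.
    """
    rng = np.random.default_rng(seed)
    X = x0 + scale * rng.standard_normal((n_samples, x0.size))
    y = np.array([f(x) for x in X])
    # Weight samples by proximity to x0 (RBF kernel).
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * scale ** 2))
    # Weighted linear regression with intercept: scale rows by sqrt(w).
    A = np.hstack([np.ones((n_samples, 1)), X])
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A * sw, y * sw.ravel(), rcond=None)
    return coef[1:]  # per-feature local weights (intercept dropped)

# Toy black box: near x0, only the first feature matters appreciably.
f = lambda x: 3.0 * x[0] + 0.01 * np.sin(x[1])
x0 = np.zeros(3)
weights = local_surrogate(f, x0)
```

The surrogate recovers a weight near 3.0 for the first feature and near zero for the inert third feature, but only in the neighbourhood of `x0` — which is exactly the locality limitation the rebuttal alludes to.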






The key contribution of our work is the development of an efficient and memory-…


We thank all the reviewers for their constructive comments. Please see our responses below. Our approach is purely based on 2D convolutions. We thank the reviewers for pointing out some related (or missing) references. Timeception, SlowFast and TSM are concurrent with our work.
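The "purely 2D convolutions" claim can be sketched as follows. The `conv2d` helper and the toy video are our own illustration, not the paper's architecture: a purely 2D approach applies the same spatial kernel to every frame independently, with no temporal kernel dimension.

```python
import numpy as np

def conv2d(frame, kernel):
    """Valid-mode 2D filtering of one frame (unoptimized, for illustration).

    Deep-learning style: the kernel is not flipped (cross-correlation),
    which is identical to convolution for this symmetric kernel.
    """
    kh, kw = kernel.shape
    h, w = frame.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return out

# A "video" of 4 frames of size 5x5; a purely 2D pipeline filters each
# frame on its own -- the temporal axis is never convolved over.
video = np.arange(4 * 5 * 5, dtype=float).reshape(4, 5, 5)
kernel = np.ones((3, 3)) / 9.0  # 3x3 spatial average
features = np.stack([conv2d(f, kernel) for f in video])
```

Here `features` has shape `(4, 3, 3)`: the frame count is unchanged, which is what distinguishes this from 3D-convolutional designs that mix frames.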


Generalization in multitask deep neural classifiers: a statistical physics approach


We would first like to thank all three reviewers for their thorough, constructive, and considered reviews. As described in Appendix A, our model is a nonequilibrium variant of Derrida's Random Energy Model. We will update the final manuscript to describe this analogy more explicitly. As such, this is still a matter of active research. Conditions claimed in L181-184: we will amend the manuscript to indicate that the equation directly preceding eqn. …
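As background (our summary of the standard equilibrium model, not of the paper's nonequilibrium variant): Derrida's Random Energy Model draws the $2^N$ configuration energies i.i.d. from a Gaussian,

```latex
Z(\beta) = \sum_{\sigma=1}^{2^N} e^{-\beta E_\sigma},
\qquad E_\sigma \sim \mathcal{N}\!\left(0, \tfrac{N J^2}{2}\right) \ \text{i.i.d.},
```

and its free energy per spin exhibits a freezing transition at $T_c$:

```latex
f(T) =
\begin{cases}
  -T\ln 2 - \dfrac{J^2}{4T}, & T \ge T_c, \\[6pt]
  -J\sqrt{\ln 2}, & T \le T_c,
\end{cases}
\qquad T_c = \frac{J}{2\sqrt{\ln 2}}.
```

The two branches agree at $T_c$, where $-T_c \ln 2 - J^2/(4T_c) = -J\sqrt{\ln 2}$.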



We thank Reviewer #1 for pointing out this interesting work. Both our…


We thank all reviewers for taking the time to provide detailed feedback and valuable suggestions for our work. However, their PDFs' exact expressions are in fact different. Alternatively, one can also use the No-U-Turn Sampler (NUTS) implemented in Stan. As shown in equation (5) of the main text, BNE's mean function consists of the … In the experiment, BNE is by construction more expressive than BAE. Figures 4-5 suggest that the former is true, but not the latter.